
    Gravitational Wave Chirp Search: Economization of PN Matched Filter Bank via Cardinal Interpolation

    The final inspiral phase in the evolution of a compact binary consisting of black holes and/or neutron stars is among the most probable events that a network of ground-based interferometric gravitational wave detectors is likely to observe. Gravitational radiation emitted during this phase will have to be dug out of noise by matched-filtering (correlating) the detector output with a bank of several 10^5 templates, making the computational resources required quite demanding, though not formidable. We propose an interpolation method for evaluating the correlation between template waveforms and the detector output and show that the method is effective in substantially reducing the number of templates required. Indeed, the number of templates needed could be a factor ~4 smaller than required by the usual approach, when the minimal overlap between the template bank and an arbitrary signal (the so-called "minimal match") is 0.97. The method is amenable to easy implementation, and the various detector projects might benefit from adopting it to reduce the computational cost of searching for inspiraling neutron star and black hole binaries. Comment: scheduled for publication in Phys. Rev. D 6
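
    To illustrate the idea, here is a minimal sketch of matched filtering over a coarse template grid with cardinal (sinc) interpolation of the correlations between grid points. It assumes white noise and a one-parameter template family; the function names and the toy chirp model are illustrative assumptions, not the paper's actual filter bank.

    ```python
    import numpy as np

    def matched_filter_overlap(data, template):
        """Normalised correlation of the data with one template (white noise assumed)."""
        return np.dot(data, template) / (np.linalg.norm(data) * np.linalg.norm(template))

    def cardinal_interpolate(coarse_values, coarse_params, query_param):
        """Whittaker-Shannon (cardinal) interpolation of values sampled on a uniform grid."""
        spacing = coarse_params[1] - coarse_params[0]
        weights = np.sinc((query_param - coarse_params) / spacing)
        return float(np.sum(weights * coarse_values))

    # Toy demonstration: correlate noisy data against a coarse grid of chirp-like
    # templates, then estimate the overlap at an off-grid parameter by interpolation.
    t = np.linspace(0.0, 1.0, 4096)
    coarse_params = np.arange(20.0, 31.0, 1.0)
    data = np.sin(2 * np.pi * 25.3 * t**2) + 0.1 * np.random.randn(t.size)
    coarse_overlaps = np.array(
        [matched_filter_overlap(data, np.sin(2 * np.pi * f * t**2)) for f in coarse_params]
    )
    estimate = cardinal_interpolate(coarse_overlaps, coarse_params, 25.3)
    ```

    The saving comes from evaluating the expensive correlations only on the coarse grid; overlaps at intermediate parameter values are then reconstructed cheaply from the sampled values rather than requiring their own templates.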

    Extending Science Gateway Frameworks to Support Big Data Applications in the Cloud

    Cloud computing offers the massive scalability and elasticity required by many scientific and commercial applications. Combining the computational and data handling capabilities of clouds with parallel processing also has the potential to tackle Big Data problems efficiently. Science gateway frameworks and workflow systems enable application developers to implement complex applications and make these available for end-users via simple graphical user interfaces. The integration of such frameworks with Big Data processing tools on the cloud opens new opportunities for application developers. This paper investigates how workflow systems and science gateways can be extended with Big Data processing capabilities. A generic approach based on infrastructure-aware workflows is suggested and a proof of concept is implemented based on the WS-PGRADE/gUSE science gateway framework and its integration with the Hadoop parallel data processing solution based on the MapReduce paradigm in the cloud. The provided analysis demonstrates that the methods described to integrate Big Data processing with workflows and science gateways work well in different cloud infrastructures and application scenarios, and can be used to create massively parallel applications for scientific analysis of Big Data.
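
    As a concrete illustration of the MapReduce paradigm the gateway integration builds on, the following is a minimal word-count example written as Hadoop Streaming scripts in Python. The file names are illustrative and are not taken from the WS-PGRADE/gUSE implementation.

    ```python
    #!/usr/bin/env python3
    # mapper.py -- emits one "word<TAB>1" pair per token read from stdin.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")
    ```

    ```python
    #!/usr/bin/env python3
    # reducer.py -- sums the counts per word; Hadoop delivers input sorted by key.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t")
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, 0
        current_count += int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")
    ```

    A job of this shape would be submitted through the hadoop-streaming jar with `-mapper mapper.py -reducer reducer.py` plus input and output paths (the jar location varies by installation); a workflow node in the gateway can wrap exactly such an invocation.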

    The New Deal: jeopardised by the geography of unemployment?

    The New Deal is the Labour government's flagship programme to "end the tragic waste of youth and long-term unemployment" by getting people off welfare benefits and into work. This paper argues that the principal weakness of the New Deal is that it seeks to influence the character of labour supply (i.e. the motivation and skills of the unemployed) while neglecting the state of labour demand, which varies greatly between places. The uneven geography of unemployment in the UK is likely to have a crucial bearing on the programme's impact and effectiveness, but this has been largely ignored in its development. The paper outlines some of the practical consequences of this imbalance and suggests how it could be rectified for the programme to be more effective.

    Should cities hosting mass gatherings invest in public health surveillance and planning? Reflections from a decade of mass gatherings in Sydney, Australia

    Background: Mass gatherings have been defined by the World Health Organisation as "events attended by a sufficient number of people to strain the planning and response resources of a community, state or nation". This paper explores the public health response to mass gatherings in Sydney, the factors that influenced the extent of deployment of resources, and the utility of planning for mass gatherings as a preparedness exercise for other health emergencies.

    Discussion: Not all mass gatherings of people require enhanced surveillance and additional response. The main drivers of extensive public health planning for mass gatherings reflect geographical spread, number of international visitors, event duration, and political and religious considerations. In these instances, the implementation of a formal risk assessment prior to the event, with ongoing daily review, is important in identifying public health hazards. Developing and utilising event-specific surveillance to provide early-warning systems that address the specific risks identified through the risk assessment process is essential. The extent to which additional resources are required will vary and depends on the current level of surveillance infrastructure. Planning the public health response is the third step in preparing for mass gatherings. If the existing public health workforce has been regularly trained in emergency response procedures, then far less effort and fewer resources will be needed to prepare for each mass gathering event. The use of formal emergency management structures and co-location of surveillance and planning operational teams during events facilitates timely communication and action.

    Summary: One-off mass gathering events can provide a catalyst for innovation and engagement, and result in opportunities for ongoing public health planning, training and surveillance enhancements that outlast each event.

    Some methods for blindfolded record linkage

    BACKGROUND: The linkage of records which refer to the same entity in separate data collections is a common requirement in public health and biomedical research. Traditionally, record linkage techniques have required that all the identifying data in which links are sought be revealed to at least one party, often a third party. This necessarily invades personal privacy and requires complete trust in the intentions of that party and their ability to maintain security and confidentiality. Dusserre, Quantin, Bouzelat and colleagues have demonstrated that it is possible to use secure one-way hash transformations to carry out follow-up epidemiological studies without any party having to reveal identifying information about any of the subjects – a technique which we refer to as "blindfolded record linkage". A limitation of their method is that only exact comparisons of values are possible, although phonetic encoding of names and other strings can be used to allow for some types of typographical variation and data errors. METHODS: A method is described which permits the calculation of a general similarity measure, the n-gram score, without having to reveal the data being compared, albeit at some cost in computation and data communication. This method can be combined with public key cryptography and automatic estimation of linkage model parameters to create an overall system for blindfolded record linkage. RESULTS: The system described offers good protection against misdeeds or security failures by any one party, but remains vulnerable to collusion between or simultaneous compromise of two or more parties involved in the linkage operation. In order to reduce the likelihood of this, the use of last-minute allocation of tasks to substitutable servers is proposed. Proof-of-concept computer programmes written in the Python programming language are provided to illustrate the similarity comparison protocol. CONCLUSION: Although the protocols described in this paper are not unconditionally secure, they do suggest the feasibility, with the aid of modern cryptographic techniques and high-speed communication networks, of a general-purpose probabilistic record linkage system which permits record linkage studies to be carried out with negligible risk of invasion of personal privacy.
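
    A minimal sketch of the underlying n-gram similarity idea follows, with keyed hashes standing in for the raw bigrams so that neither party reveals the compared values. This simplified scheme is not the paper's full protocol (it still leaks set sizes and intersection cardinality, and omits the public-key and parameter-estimation machinery); the shared key and helper names are hypothetical.

    ```python
    import hashlib
    import hmac

    def bigrams(s):
        """Set of overlapping two-character substrings of a lower-cased string."""
        s = s.lower()
        return {s[i:i + 2] for i in range(len(s) - 1)}

    def hashed_bigrams(s, key):
        """HMAC-SHA256 each bigram so the raw value is never disclosed."""
        return {hmac.new(key, g.encode(), hashlib.sha256).hexdigest() for g in bigrams(s)}

    def dice_score(set_a, set_b):
        """n-gram (Dice) similarity: 2 * |A & B| / (|A| + |B|)."""
        if not set_a or not set_b:
            return 0.0
        return 2 * len(set_a & set_b) / (len(set_a) + len(set_b))

    key = b"shared-secret"  # hypothetical key agreed by the data custodians
    score = dice_score(hashed_bigrams("smith", key), hashed_bigrams("smyth", key))
    ```

    Because HMAC maps equal bigrams to equal digests, the Dice score computed over the hashed sets equals the score over the raw bigrams, which is what allows the comparison to proceed "blindfolded".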

    A study on the natural history of scanning behaviour in patients with visual field defects after stroke

    BACKGROUND: A visual field defect (VFD) is a common consequence of stroke with a detrimental effect upon the survivors' functional ability and quality of life. The identification of effective treatments for VFD is a key priority relating to life post-stroke. Understanding the natural evolution of scanning compensation over time may have important ramifications for the development of efficacious therapies. The study aims to unravel the natural history of visual scanning behaviour in patients with VFD. The assessment of scanning patterns in the acute to chronic stages of stroke will reveal who does and does not learn to compensate for vision loss. METHODS/DESIGN: Eye-tracking glasses are used to delineate eye movements in a cohort of 100 stroke patients immediately after stroke, and additionally at 6 and 12 months post-stroke. The longitudinal study will assess eye movements in static (sitting) and dynamic (walking) conditions. The primary outcome is the change in lateral eye movements from the acute to the chronic stage of stroke. Secondary outcomes include changes in lateral eye movements over time as a function of subgroup characteristics, such as side of VFD, stroke location, stroke severity and cognitive functioning. DISCUSSION: The longitudinal comparison of patients who do and do not learn compensatory scanning techniques may reveal important prognostic markers of natural recovery. Importantly, it may also help to determine the most effective treatment window for visual rehabilitation.

    Tobias Loetscher, Celia Chen, Sophie Wignall, Andreas Bulling, Sabrina Hoppe, Owen Churches, Nicole A Thomas, Michael E R Nicholls and Andrew Le
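
    To make the primary outcome concrete, here is a hypothetical sketch of one way a lateral scanning bias could be quantified from eye-tracker gaze samples. The normalisation, variable names and numbers are illustrative assumptions, not the study's actual analysis pipeline.

    ```python
    import numpy as np

    def lateral_bias(gaze_x):
        """Mean signed horizontal gaze position on a [-1, 1] scale.

        Negative values indicate a leftward scanning bias; NaNs (tracking
        dropouts) are ignored.
        """
        return float(np.nanmean(np.asarray(gaze_x, dtype=float)))

    # Comparing acute vs. 12-month sessions for one patient with a left-sided VFD:
    acute = lateral_bias([-0.05, 0.20, 0.32, 0.15])      # little exploration of the left
    chronic = lateral_bias([-0.40, -0.10, 0.05, -0.25])  # more scanning into the blind field
    change = chronic - acute                             # primary-outcome-style contrast
    ```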

    A proposed architecture and method of operation for improving the protection of privacy and confidentiality in disease registers

    BACKGROUND: Disease registers aim to collect information about all instances of a disease or condition in a defined population of individuals. Traditionally, methods of operating disease registers have required that notifications of cases be identified by unique identifiers such as social security number or national identification number, or by ensembles of non-unique identifying data items, such as name, sex and date of birth. However, growing concern over the privacy and confidentiality aspects of disease registers may hinder their future operation. Technical solutions to these legitimate concerns are needed. DISCUSSION: An alternative method of operation is proposed which involves splitting the personal identifiers from the medical details at the source of notification, and separately encrypting each part using asymmetric (public key) cryptographic methods. The identifying information is sent to a single Population Register, and the medical details to the relevant disease register. The Population Register uses probabilistic record linkage to assign a unique personal identification (UPI) number to each person notified to it, although not necessarily everyone in the entire population. This UPI is shared only with a single trusted third party whose sole function is to translate between this UPI and separate series of personal identification numbers which are specific to each disease register. SUMMARY: The system proposed would significantly improve the protection of privacy and confidentiality, while still allowing the efficient linkage of records between disease registers, under the control and supervision of the trusted third party and independent ethics committees. The proposed architecture could accommodate genetic databases and tissue banks as well as a wide range of other health and social data collections. It is important that proposals such as this are subject to widespread scrutiny by information security experts, researchers and interested members of the general public alike.
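
    A minimal sketch of the split-and-encrypt step at the source of notification, assuming the Python `cryptography` package: identifiers are encrypted under the Population Register's public key and medical details under the disease register's key. The field names and key handling are illustrative, and a real deployment would use hybrid encryption for payloads larger than a single RSA-OAEP block.

    ```python
    import json
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    def encrypt_for(public_key, record):
        """RSA-OAEP encrypt a small JSON payload for one recipient."""
        return public_key.encrypt(
            json.dumps(record).encode(),
            padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                         algorithm=hashes.SHA256(), label=None),
        )

    # Each register would publish its own key; both are generated here for the demo.
    population_register_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    disease_register_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    notification = {"name": "Jane Doe", "dob": "1960-01-01", "sex": "F",
                    "diagnosis": "C50", "date_of_diagnosis": "2003-07-15"}

    # Split the notification: identifiers one way, medical details the other.
    identifiers = {k: notification[k] for k in ("name", "dob", "sex")}
    medical = {k: notification[k] for k in ("diagnosis", "date_of_diagnosis")}

    to_population_register = encrypt_for(population_register_key.public_key(), identifiers)
    to_disease_register = encrypt_for(disease_register_key.public_key(), medical)
    ```

    The point of the split is that no single recipient ever holds both halves in the clear; only the trusted third party can relate the Population Register's UPI to a register-specific identification number.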

    Mortality after admission for acute myocardial infarction in Aboriginal and non-Aboriginal people in New South Wales, Australia: a multilevel data linkage study

    Background - Heart disease is a leading cause of the gap in burden of disease between Aboriginal and non-Aboriginal Australians. Our study investigated short- and long-term mortality after admission for Aboriginal and non-Aboriginal people admitted with acute myocardial infarction (AMI) to public hospitals in New South Wales, Australia, and examined the impact of the hospital of admission on outcomes. Methods - Admission records were linked to mortality records for 60,047 patients aged 25–84 years admitted with a diagnosis of AMI between July 2001 and December 2008. Multilevel logistic regression was used to estimate adjusted odds ratios (AOR) for 30- and 365-day all-cause mortality. Results - Aboriginal patients admitted with an AMI were younger than non-Aboriginal patients, and more likely to be admitted to lower-volume, remote hospitals without on-site angiography. Adjusting for age, sex, year and hospital, Aboriginal patients had a similar 30-day mortality risk to non-Aboriginal patients (AOR: 1.07; 95% CI 0.83-1.37) but a higher risk of dying within 365 days (AOR: 1.34; 95% CI 1.10-1.63). The latter difference did not persist after adjustment for comorbid conditions (AOR: 1.12; 95% CI 0.91-1.38). Patients admitted to more remote hospitals, those with lower patient volume and those without on-site angiography had increased risk of short- and long-term mortality regardless of Aboriginal status. Conclusions - Improving access to larger hospitals and those with specialist cardiac facilities could improve outcomes following AMI for all patients. However, major efforts to boost primary and secondary prevention of AMI are required to reduce the mortality gap between Aboriginal and non-Aboriginal people.
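
    As an illustration of the kind of multilevel model described (patients nested within hospitals, with a hospital-level random intercept), here is a hypothetical sketch using statsmodels' Bayesian mixed GLM. The data file and column names are assumptions for illustration, and the study's actual model specification may differ.

    ```python
    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    df = pd.read_csv("ami_admissions.csv")  # hypothetical linked admission/death data

    # 30-day mortality modelled with fixed effects for Aboriginal status, age,
    # sex and year, plus a random intercept for each hospital of admission.
    model = BinomialBayesMixedGLM.from_formula(
        "died_30d ~ aboriginal + age + C(sex) + C(year)",
        {"hospital": "0 + C(hospital)"},
        df,
    )
    result = model.fit_vb()   # variational Bayes fit
    print(result.summary())   # posterior means approximate the adjusted log-odds
    ```

    Exponentiating the fixed-effect coefficient for `aboriginal` gives an adjusted odds ratio comparable in spirit to the AORs reported above, while the hospital random intercept absorbs between-hospital differences in baseline mortality.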

    Pedagogical approaches for e-assessment with authentication and authorship verification in Higher Education

    Checking the identity of students and the authorship of their online submissions is a major concern in Higher Education due to the increasing amount of plagiarism and cheating using the Internet. The literature on the effects of e-authentication systems on teaching staff is very limited because the procedure is novel to them. A considerable gap is to understand teaching staff's views regarding the use of e-authentication instruments and how these impact trust in e-assessment. This mixed-method study examines the concerns and practices of 108 teaching staff who used the TeSLA - Adaptive Trust-based e-Assessment System in six countries: the UK, Spain, the Netherlands, Bulgaria, Finland and Turkey. The findings revealed technological, organisational and pedagogical issues related to accessibility, security, privacy, and e-assessment design and feedback. Recommendations include providing a FAQ and an audit report with results, raising awareness about data security and privacy, and developing policies and guidelines on fraud detection and prevention, e-assessment best practice and course team support.
